Entropy-regularized 2-Wasserstein distance between Gaussian measures

Abstract

Gaussian distributions are plentiful in applications dealing in uncertainty quantification and diffusivity. They furthermore stand as important special cases for frameworks providing geometries for probability measures, as the resulting geometry on Gaussians is often expressible in closed form under these frameworks. In this work, we study the Gaussian geometry under the entropy-regularized 2-Wasserstein distance, by providing closed-form solutions for the distance and for interpolations between elements. Furthermore, we provide a fixed-point characterization of a population barycenter when restricted to the manifold of Gaussians, which allows computations through a fixed-point iteration algorithm. As a consequence, the results yield closed-form expressions for the 2-Sinkhorn divergence. As the geometries change with varying regularization magnitude, we study the limiting cases of vanishing and infinite magnitudes, reconfirming well-known limits of the Sinkhorn divergence. Finally, we illustrate the resulting geometries with a numerical case study.
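The paper's closed-form expressions are not reproduced on this page, but the distance the abstract refers to can at least be approximated numerically: on empirical samples, entropy-regularized optimal transport is computed with the standard Sinkhorn fixed-point iteration. A minimal NumPy sketch, with illustrative function and parameter names (not taken from the paper):

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps, n_iter=500):
    """Transport cost of the entropy-regularized OT plan between
    discrete measures a, b with cost matrix C (Sinkhorn iteration)."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # alternate marginal-scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]      # regularized transport plan
    return np.sum(P * C)

# empirical samples from two 1-D Gaussians, as illustrative stand-ins
# for the Gaussian measures the paper treats in closed form
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=200)
y = rng.normal(2.0, 1.5, size=200)
C = (x[:, None] - y[None, :]) ** 2       # squared-distance cost
a = np.full(200, 1.0 / 200)
b = np.full(200, 1.0 / 200)
cost = sinkhorn_cost(a, b, C, eps=0.5)   # near W2^2 = 4.25 for these Gaussians
```

Here `eps` plays the role of the regularization magnitude: as it shrinks toward zero the cost approaches the unregularized 2-Wasserstein cost, consistent with the vanishing-regularization limit discussed in the abstract.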


Similar articles

The Entropy Distance between the Wiener and Stationary Gaussian Measures

Investigating the entropy distance between the Wiener measure, W_{t0,τ}, and stationary Gaussian measures, Q_{t0,τ}, on the space of continuous functions C[t0 − τ, t0 + τ], we show that in some cases this distance can essentially be computed. This is done by explicitly computing a related quantity which in effect is a valid approximation of the entropy distance, provided it is sufficiently small; thi...


Wasserstein Geometry of Gaussian Measures

This paper concerns the Riemannian/Alexandrov geometry of Gaussian measures from the viewpoint of the L2-Wasserstein geometry. The space of Gaussian measures is of finite dimension, which allows one to write down the explicit Riemannian metric that in turn induces the L2-Wasserstein distance. Moreover, its completion as a metric space provides a complete picture of the singular behavior of the L...
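For reference, the explicit metric mentioned above induces the well-known closed form W2²(N(m1, Σ1), N(m2, Σ2)) = ‖m1 − m2‖² + tr(Σ1 + Σ2 − 2(Σ1^{1/2} Σ2 Σ1^{1/2})^{1/2}). A minimal NumPy sketch of this formula (the `sqrtm_psd` helper is our own, assuming symmetric positive semi-definite inputs):

```python
import numpy as np

def sqrtm_psd(A):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def w2_gaussian(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2)."""
    rS1 = sqrtm_psd(S1)
    cross = sqrtm_psd(rS1 @ S2 @ rS1)
    bures = np.trace(S1 + S2 - 2.0 * cross)   # Bures metric between covariances
    return np.sqrt(np.sum((m1 - m2) ** 2) + max(bures, 0.0))

# for equal covariances the distance reduces to the distance between means
d = w2_gaussian(np.zeros(2), np.eye(2), np.array([3.0, 4.0]), np.eye(2))
# d ≈ 5.0
```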


Sliced Wasserstein Distance for Learning Gaussian Mixture Models

Gaussian mixture models (GMM) are powerful parametric tools with many applications in machine learning and computer vision. Expectation maximization (EM) is the most popular algorithm for estimating the GMM parameters. However, EM guarantees only convergence to a stationary point of the log-likelihood function, which could be arbitrarily worse than the optimal solution. Inspired by the relation...
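The sliced Wasserstein distance referred to above replaces a d-dimensional transport problem with an average of cheap one-dimensional ones over random projections, where the 1-D 2-Wasserstein distance between equal-size samples reduces to matching sorted values. A minimal Monte-Carlo sketch in NumPy (not the authors' implementation; names are illustrative):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, rng=None):
    """Monte-Carlo sliced 2-Wasserstein distance between two equal-size
    samples X, Y of shape (n, d), averaged over random 1-D projections."""
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=X.shape[1])
        theta /= np.linalg.norm(theta)      # random unit direction
        px = np.sort(X @ theta)             # 1-D W2 between projections
        py = np.sort(Y @ theta)             # reduces to sorted matching
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_proj)

# two Gaussian samples separated by a mean shift
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
Y = rng.normal(size=(300, 2)) + np.array([2.0, 0.0])
sw = sliced_wasserstein(X, Y, rng=1)        # grows with the separation
```

Because each projection step only sorts two length-n arrays, the cost per slice is O(n log n), which is what makes this distance attractive as a GMM fitting objective compared with solving full transport problems.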


On Wasserstein Geometry of the Space of Gaussian Measures

Abstract. The space of measures with finite second moment is an infinite-dimensional metric space endowed with the Wasserstein distance, while the space of Gaussian measures on Euclidean space is parameterized by mean and covariance matrices and is hence a finite-dimensional manifold. By restricting to the space of Gaussian measures inside the space of probability measures, we manage to ...


Evaluation of Distance Measures Between Gaussian Mixture Models of MFCCs

In music similarity and in the related task of genre classification, a distance measure between Gaussian mixture models is frequently needed. We present a comparison of the Kullback-Leibler distance, the earth mover's distance and the normalized L2 distance for this application. Although the normalized L2 distance was slightly inferior to the Kullback-Leibler distance with respect to classificati...



Journal

Journal title: Information Geometry

Year: 2021

ISSN: 2511-2481, 2511-249X

DOI: https://doi.org/10.1007/s41884-021-00052-8